inference system - meaning and definition. What is an inference system

What is an inference system - definitions

Component of a system that applies logical rules to a knowledge base to deduce new information.
Related terms: expert system shell; inference system; rule-based inference engine

inference engine         
A program that infers new facts from known facts using inference rules. Commonly found as part of a Prolog interpreter, expert system or knowledge-based system. (1994-11-01)
Inference engine         
In the field of artificial intelligence, an inference engine is a component of the system that applies logical rules to the knowledge base to deduce new information. The first inference engines were components of expert systems.
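
As a sketch of this idea, the following is a minimal forward-chaining loop in Python. The rule format and fact names are illustrative assumptions, not any particular engine's API: each rule pairs a set of premises with a conclusion, and the engine keeps applying rules to the known facts until nothing new can be deduced.

    # A minimal forward-chaining sketch (illustrative rule format, not any
    # particular engine's API): each rule pairs a set of premises with a
    # conclusion, and the loop keeps firing rules until no new fact appears.

    rules = [
        ({"socrates_is_human"}, "socrates_is_mortal"),
        ({"socrates_is_mortal", "mortals_die"}, "socrates_dies"),
    ]

    def forward_chain(facts, rules):
        facts = set(facts)
        changed = True
        while changed:                     # iterate: new facts may trigger more rules
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)  # deduce new information
                    changed = True
        return facts

    print(forward_chain({"socrates_is_human", "mortals_die"}, rules))
    # -> includes 'socrates_is_mortal' and 'socrates_dies'
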
Adaptive neuro fuzzy inference system         
Also known as: ANFIS; adaptive network-based fuzzy inference system; adaptive neuro-fuzzy inference system
An adaptive neuro-fuzzy inference system or adaptive network-based fuzzy inference system (ANFIS) is a kind of artificial neural network that is based on the Takagi–Sugeno fuzzy inference system. The technique was developed in the early 1990s.
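
For illustration, here is a minimal Takagi–Sugeno fuzzy inference step in Python. The membership functions and rule parameters are hand-picked assumptions, not a trained ANFIS: each rule weights a linear consequent by how strongly the input matches its fuzzy set, and the crisp output is the weighted average. ANFIS would learn these membership and consequent parameters from data.

    import math

    # A minimal Takagi-Sugeno fuzzy inference sketch with hand-picked values
    # (not a trained ANFIS): each rule has a Gaussian membership function for
    # the input and a linear consequent y = a*x + b; the crisp output is the
    # firing-strength-weighted average of the rule consequents.

    def gaussian(x, centre, sigma):
        return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

    # (membership centre, membership width, consequent coefficients (a, b))
    rules = [
        (0.0, 1.0, (0.5, 0.0)),   # "if x is LOW  then y = 0.5*x"
        (5.0, 1.0, (2.0, 1.0)),   # "if x is HIGH then y = 2.0*x + 1"
    ]

    def sugeno_infer(x):
        weights = [gaussian(x, c, s) for c, s, _ in rules]
        outputs = [a * x + b for _, _, (a, b) in rules]
        return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

    print(round(sugeno_infer(1.0), 3))   # output dominated by the LOW rule
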

Wikipedia

Inference engine

In the field of artificial intelligence, an inference engine is a component of the system that applies logical rules to the knowledge base to deduce new information. The first inference engines were components of expert systems. The typical expert system consisted of a knowledge base and an inference engine. The knowledge base stored facts about the world. The inference engine applied logical rules to the knowledge base and deduced new knowledge. This process would iterate, as each new fact in the knowledge base could trigger additional rules in the inference engine. Inference engines work primarily in one of two modes: forward chaining and backward chaining. Forward chaining starts with the known facts and asserts new facts. Backward chaining starts with goals and works backward to determine what facts must be asserted so that the goals can be achieved.
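
To contrast with the forward-chaining sketch shown earlier, here is a minimal backward-chaining sketch in Python, using the same illustrative rule format: the engine starts from a goal and recursively checks whether some rule concludes it and whether that rule's premises can in turn be established from the known facts.

    # A minimal backward-chaining sketch (same illustrative rule format as the
    # forward-chaining example): start from a goal and work backwards, asking
    # whether some rule concludes it and whether that rule's premises can in
    # turn be proved from the known facts.

    rules = [
        ({"socrates_is_human"}, "socrates_is_mortal"),
        ({"socrates_is_mortal", "mortals_die"}, "socrates_dies"),
    ]

    def backward_chain(goal, facts, rules):
        if goal in facts:                  # goal is already a known fact
            return True
        for premises, conclusion in rules:
            if conclusion == goal and all(
                backward_chain(p, facts, rules) for p in premises
            ):
                return True
        return False

    print(backward_chain("socrates_dies", {"socrates_is_human", "mortals_die"}, rules))
    # -> True
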